Publishing a Dynamic Website Statically
I’m not quite ready to fully host this website as a dynamic web server on the net, so I thought I’d review how I publish it to the web anyway.
AWS Setup Steps
- Setup domain name in Route53
- Request a TLS certificate through ACM
- Verify the TLS request in Route53
- Create an S3 bucket with all public access blocked (NOTE! Do not choose the static website hosting option!).
- Create a CloudFront distribution with an OAI for the S3 bucket, attach our TLS cert, set HTTP to redirect to HTTPS, and add our domain name as the alternate domain name (alias/CNAME).
- Create the A record, choose the sub-option of using an AWS ALIAS and point it to our CloudFront distribution.
- Create 2 IAM policies: first, full access to just our S3 bucket (restricted by ARN); second, invalidation-only access to just our CloudFront distribution, again by ARN (why couldn’t Amazon just call them Arrghs? ARN is so boring).
- Create a group in IAM and attach the two policies to it (policies attach directly to groups; IAM roles can’t be assigned to groups).
- Create a user, place them in the group, download CLI credentials, and set up the local ~/.aws/credentials file.
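To make the two-policy step above concrete, here’s a sketch of what the policy documents might look like, written out as JSON files you could feed to `aws iam create-policy`. The bucket name `gatewaynode`, the account ID `123456789012`, and the distribution ID `SOMEDISTNUMBER` are placeholders; swap in your own.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Policy 1: full access, but only to our one bucket (restricted by ARN).
# Both the bucket ARN and the object ARN ("/*") are needed for sync to work.
cat > s3-publish-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::gatewaynode",
        "arn:aws:s3:::gatewaynode/*"
      ]
    }
  ]
}
EOF

# Policy 2: invalidation only, scoped to our one distribution by ARN.
cat > cloudfront-invalidate-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "cloudfront:CreateInvalidation",
      "Resource": "arn:aws:cloudfront::123456789012:distribution/SOMEDISTNUMBER"
    }
  ]
}
EOF

# Then, with admin credentials:
# aws iam create-policy --policy-name s3-publish \
#   --policy-document file://s3-publish-policy.json
# aws iam create-policy --policy-name cloudfront-invalidate \
#   --policy-document file://cloudfront-invalidate-policy.json
```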
Yeah, that’s the quick and dirty of the AWS bits. If you can noodle through those, then you can write a publishing script like the one below. It relies heavily on having wget and the AWS CLI installed.
publish.bash
#!/usr/bin/env bash
# set -euo pipefail
echo "Begin stage 1";
if [ -d gatewaynode.com ]; then
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://localhost:8000;
else
    echo "gatewaynode.com dir not found, exiting";
    exit 1;
fi
echo "Begin stage 2";
if [ -d "localhost:8000" ]; then
    rm -rvf gatewaynode.com/*
    mv -vf "localhost:8000"/* gatewaynode.com/;
    rm -rvf "localhost:8000";
else
    echo "Mirroring failed";
    exit 1;
fi
echo "Begin stage 3";
if [ -f ~/.aws/credentials ]; then
    aws s3 sync gatewaynode.com/. s3://gatewaynode;
    aws cloudfront create-invalidation --distribution-id SOMEDISTNUMBER --paths "/*";
else
    echo "AWS credentials not found, sync and invalidate aborted";
    exit 1;
fi
That’s the quick and dirty of it. It takes about 10 seconds to mirror the site and push it to S3, plus however long CloudFront is going to take that day to invalidate the cache (usually less than 10 minutes, but this varies wildly). So I almost have this where I want it to be: a command line CMS for making entries and managing the content, a web server for generating the site, and a publishing workflow for pushing from local to the net.
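If the variable invalidation time ever matters (say, the publish script shouldn’t report success until the new content is actually live), the AWS CLI can block until CloudFront finishes. This fragment assumes live credentials and reuses the `SOMEDISTNUMBER` placeholder from the script above, so it’s a sketch rather than something you can run as-is:

```shell
# Capture the invalidation ID, then wait for CloudFront to finish it.
INV_ID=$(aws cloudfront create-invalidation \
    --distribution-id SOMEDISTNUMBER --paths "/*" \
    --query 'Invalidation.Id' --output text)
aws cloudfront wait invalidation-completed \
    --distribution-id SOMEDISTNUMBER --id "$INV_ID"
```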
NOTE: I commented out "set -euo pipefail" because parts of the wget mirror will almost always fail, causing a script exit (unless I create an actual favicon.ico and robots.txt and clean up those junk links I added for testing...).
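One way to keep "set -euo pipefail" without creating the missing files would be to tolerate only the wget exit status those 404s produce (wget exits with status 8 when the server issues an error response) while still failing hard on anything else. A minimal sketch of that pattern, with a hypothetical `tolerate` helper:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Run a command, treating one specific non-zero exit status as success.
# Any other failure still propagates, so set -e keeps its teeth.
tolerate() {
    local ok_status="$1"; shift
    local status=0
    "$@" || status=$?
    if [ "$status" -ne 0 ] && [ "$status" -ne "$ok_status" ]; then
        return "$status"
    fi
}

# In the publish script, stage 1 would then become:
# tolerate 8 wget --mirror --convert-links --adjust-extension \
#     --page-requisites --no-parent http://localhost:8000
```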